1 средняя условная энтропия (average conditional entropy)
Programming: average conditional entropy (conditional entropy averaged over all outcomes of another situation; see Теория передачи информации. Терминология [Information Transmission Theory: Terminology], Issue 94, Moscow: Nauka, 1979)
Universal Russian-English Dictionary > средняя условная энтропия
2 средняя энтропия (average entropy)
1. average entropy
2. mean entropy
Large Russian-English Basic Dictionary > средняя энтропия
3 bedingte Entropie (conditional entropy)
Entropie f: bedingte Entropie f KO conditional entropy, average conditional information content, equivocation, remaining uncertainty (quantity in information theory; H_y(x))
German-English Dictionary of Electrical Engineering and Electronics > bedingte Entropie
4 средняя условная дифференциальная энтропия (average conditional differential entropy)
General subject: average conditional differential entropy (conditional differential entropy averaged over all outcomes of another situation; see Теория передачи информации. Терминология [Information Transmission Theory: Terminology], Issue 94, Moscow: Nauka, 1979)
Universal Russian-English Dictionary > средняя условная дифференциальная энтропия
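The headword's definition — conditional entropy averaged over all outcomes of the other variable, H(Y|X) = Σₓ p(x) H(Y|X=x) — can be sketched as follows. The joint distribution is purely illustrative; entropies are in bits:

```python
import math

# Hypothetical joint distribution p(x, y) over X in {0, 1}, Y in {0, 1}.
joint = {(0, 0): 0.4, (0, 1): 0.1, (1, 0): 0.2, (1, 1): 0.3}

def conditional_entropy(joint):
    """H(Y|X): entropy of Y given X = x, averaged over all outcomes x."""
    # Marginal p(x)
    px = {}
    for (x, _), p in joint.items():
        px[x] = px.get(x, 0.0) + p
    h = 0.0
    for x, p_x in px.items():
        # Entropy of Y for this particular outcome x (in bits)
        h_y_given_x = 0.0
        for (xx, _), p in joint.items():
            if xx == x and p > 0:
                p_cond = p / p_x
                h_y_given_x -= p_cond * math.log2(p_cond)
        h += p_x * h_y_given_x  # average over the outcomes of X
    return h

print(round(conditional_entropy(joint), 4))  # 0.8464
```

Conditioning can never increase uncertainty, so H(Y|X) ≤ H(Y) always holds (here H(Y) ≈ 0.9710 bits).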
See also in other dictionaries:
Entropy (information theory) — In information theory, entropy is a measure of the uncertainty associated with a random variable. The term by itself in this context usually refers to the Shannon entropy, which quantifies, in the sense of an expected value, the information… … Wikipedia
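The Shannon entropy named in the excerpt above is the expected surprisal, H(X) = −Σ p(x) log₂ p(x). A minimal sketch (the distributions are illustrative):

```python
import math

def shannon_entropy(probs):
    """Shannon entropy in bits: the expected value of -log2 p(x)."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A fair coin carries exactly 1 bit of uncertainty.
print(shannon_entropy([0.5, 0.5]))  # 1.0
# A certain outcome carries none.
print(shannon_entropy([1.0]))       # 0.0
```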
Cross entropy — In information theory, the cross entropy between two probability distributions measures the average number of bits needed to identify an event from a set of possibilities, if a coding scheme is used based on a given probability distribution q,… … Wikipedia
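The cross entropy described above, H(p, q) = −Σ p(x) log₂ q(x), is the average code length paid when events from p are coded with a scheme built for q. A sketch with illustrative distributions:

```python
import math

def cross_entropy(p, q):
    """Average bits to code outcomes drawn from p using a code built for q."""
    return -sum(pi * math.log2(qi) for pi, qi in zip(p, q) if pi > 0)

p = [0.5, 0.5]  # true distribution (illustrative)
q = [0.9, 0.1]  # mismatched coding distribution
# Coding with the matched distribution is optimal: H(p, p) = H(p).
print(cross_entropy(p, p))                         # 1.0
# A mismatched code always costs at least as many bits.
print(cross_entropy(p, q) > cross_entropy(p, p))   # True
```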
Differential entropy — (also referred to as continuous entropy) is a concept in information theory that extends the idea of (Shannon) entropy, a measure of average surprisal of a random variable, to continuous probability distributions. Contents 1 Definition 2… … Wikipedia
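For the differential entropy of the excerpt above, a standard closed form is the Gaussian case, h = ½ ln(2πeσ²) nats; the sketch below checks it against a midpoint-rule integral of −f(x) ln f(x) (grid bounds and step are arbitrary choices):

```python
import math

def gaussian_differential_entropy(sigma):
    """Closed form for a Gaussian: h = 0.5 * ln(2 * pi * e * sigma^2), in nats."""
    return 0.5 * math.log(2 * math.pi * math.e * sigma ** 2)

def riemann_estimate(sigma, lo=-10.0, hi=10.0, n=20000):
    """Numerical check: -integral of f(x) * ln f(x) dx on a midpoint grid."""
    dx = (hi - lo) / n
    h = 0.0
    for i in range(n):
        x = lo + (i + 0.5) * dx
        f = math.exp(-x * x / (2 * sigma ** 2)) / (sigma * math.sqrt(2 * math.pi))
        if f > 0:
            h -= f * math.log(f) * dx
    return h

print(round(gaussian_differential_entropy(1.0), 4))  # 1.4189
```

Unlike Shannon entropy, differential entropy can be negative (e.g. σ < 1/√(2πe)).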
Maximum entropy probability distribution — In statistics and information theory, a maximum entropy probability distribution is a probability distribution whose entropy is at least as great as that of all other members of a specified class of distributions. According to the principle of… … Wikipedia
Von Neumann entropy — In quantum statistical mechanics, von Neumann entropy refers to the extension of classical entropy concepts to the field of quantum mechanics. John von Neumann rigorously established the correct mathematical framework for quantum mechanics with… … Wikipedia
Information entropy — While ordinal insolvency prognoses merely rank companies according to their expected default probabilities, cardinal insolvency prognoses explicitly assign each company a default probability.[1]… … Deutsch Wikipedia
Information theory — Not to be confused with Information science. Information theory is a branch of applied mathematics and electrical engineering involving the quantification of information. Information theory was developed by Claude E. Shannon to find fundamental… … Wikipedia
Kullback–Leibler divergence — In probability theory and information theory, the Kullback–Leibler divergence[1][2][3] (also information divergence, information gain, relative entropy, or KLIC) is a non-symmetric measure of the difference between two probability distributions P … Wikipedia
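The Kullback–Leibler divergence of the excerpt above, D(p‖q) = Σ p(x) log₂(p(x)/q(x)), is non-negative and, as noted, non-symmetric. A sketch with illustrative distributions:

```python
import math

def kl_divergence(p, q):
    """D(p || q) in bits; non-negative, and zero exactly when p == q."""
    return sum(pi * math.log2(pi / qi) for pi, qi in zip(p, q) if pi > 0)

p = [0.5, 0.5]
q = [0.9, 0.1]
print(kl_divergence(p, p))  # 0.0
# Non-symmetric: D(p||q) generally differs from D(q||p).
print(kl_divergence(p, q) != kl_divergence(q, p))  # True
```

It also links the two previous notions: cross entropy H(p, q) = H(p) + D(p‖q).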
Quantities of information — A simple information diagram illustrating the relationships among some of Shannon's basic quantities of information. The mathematical theory of information is based on probability theory and statistics, and measures information with several… … Wikipedia
Mutual information — Individual (H(X),H(Y)), joint (H(X,Y)), and conditional entropies for a pair of correlated subsystems X,Y with mutual information I(X; Y). In probability theory and information theory, the mutual information (sometimes known by the archaic term… … Wikipedia
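The mutual information I(X;Y) of the excerpt above measures the shared uncertainty of the pair; one standard form is I(X;Y) = Σ p(x,y) log₂(p(x,y)/(p(x)p(y))), equivalently H(Y) − H(Y|X). A sketch with illustrative joint distributions:

```python
import math

def mutual_information(joint):
    """I(X;Y) = sum over (x, y) of p(x,y) * log2( p(x,y) / (p(x) * p(y)) ), in bits."""
    px, py = {}, {}
    for (x, y), p in joint.items():
        px[x] = px.get(x, 0.0) + p
        py[y] = py.get(y, 0.0) + p
    return sum(p * math.log2(p / (px[x] * py[y]))
               for (x, y), p in joint.items() if p > 0)

# Independent variables carry zero mutual information.
independent = {(0, 0): 0.25, (0, 1): 0.25, (1, 0): 0.25, (1, 1): 0.25}
print(mutual_information(independent))  # 0.0
# Perfectly correlated fair bits: I(X;Y) = H(X) = 1 bit.
correlated = {(0, 0): 0.5, (1, 1): 0.5}
print(mutual_information(correlated))   # 1.0
```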
Indus script — an undeciphered Bronze Age writing system … Wikipedia